Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers

Authors

  • Gavin C. Cawley
  • Nicola L.C. Talbot
Abstract

Mika et al. (in: Neural Networks for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the “kernel trick” to obtain a non-linear variant of Fisher’s linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a computational complexity of only O(ℓ³) operations rather than the O(ℓ⁴) of a naïve implementation, where ℓ is the number of training patterns. Leave-one-out cross-validation then becomes an attractive means of model selection in large-scale applications of kernel Fisher discriminant analysis, being significantly faster than the conventional k-fold cross-validation procedures commonly used. © 2003 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
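The efficiency gain rests on the fact that, for ridge-type kernel machines of this family, all ℓ leave-one-out residuals can be read off one fit of the full model rather than ℓ separate refits. A minimal sketch of this idea for kernel ridge regression (a bias-free relative of the kernel Fisher discriminant / LS-SVM formulation; the function names and the RBF kernel choice here are illustrative, not taken from the paper) uses the standard PRESS identity e₍ᵢ₎ = (yᵢ − fᵢ)/(1 − Hᵢᵢ), where H = K(K + λI)⁻¹ is the smoother ("hat") matrix:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def loo_residuals(K, y, lam=0.1):
    """Exact leave-one-out residuals from a single fit of the full model.

    For the kernel ridge smoother H = K (K + lam*I)^{-1}, the PRESS identity
    gives e_(i) = (y_i - f_i) / (1 - H_ii), so one O(l^3) factorisation
    replaces l separate O(l^3) refits.
    """
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))  # smoother (hat) matrix
    f = H @ y                                   # fitted values on full data
    return (y - f) / (1.0 - np.diag(H))         # exact LOO residuals
```

Each residual agrees exactly with the one obtained by deleting pattern i, refitting on the remaining ℓ − 1 patterns, and predicting yᵢ; the closed form simply avoids the ℓ refits.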


Related articles

Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers

Mika et al. [1] apply the “kernel trick” to obtain a non-linear variant of Fisher’s linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark datasets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a computational complexity of only O(l3) operations rather than the O(l4) of a naïve impl...


Efficient cross-validation of kernel Fisher discriminant classifiers

Mika et al. [1] introduce a non-linear formulation of the Fisher discriminant based on the well-known “kernel trick”, later shown to be equivalent to the Least-Squares Support Vector Machine [2, 3]. In this paper, we show that the cross-validation error can be computed very efficiently for this class of kernel machine, specifically that leave-one-out cross-validation can be performed with a comput...


Optimally regularised kernel Fisher discriminant classification

Mika, Rätsch, Weston, Schölkopf and Müller [Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. In Neural networks for signal processing: Vol. IX (pp. 41-48). New York: IEEE Press] introduce a non-linear formulation of Fisher's linear discriminant, based on the now familiar "kernel trick", demonstrating state-of-the-art performance...


Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation

Kernel Fisher discriminant analysis (KFD) is a successful approach to classification. It is well known that the key challenge in KFD lies in the selection of free parameters such as kernel parameters and regularization parameters. Here we focus on the feature-scaling kernel, in which each feature is individually associated with a scaling factor. A novel algorithm, named FS-KFD, is developed to tune th...


Rapid and Brief Communication: An efficient renovation on kernel Fisher discriminant analysis and face recognition experiments

A reformative kernel algorithm, which can deal with two-class problems as well as those with more than two classes, on Fisher discriminant analysis is proposed. In the novel algorithm the supposition is made that, in feature space, the discriminant vector can be approximated by some linear combination of a part of the training samples, called “significant nodes”. If the “significant nodes” are found out, ...



Journal title:

Volume   Issue 

Pages  -

Publication date  2003